Research Overview
We study communications and machine learning (ML) from an information-theoretic perspective. Specifically, we investigate the fundamental limits of source compression (source coding) and information transmission (channel coding), and we exploit coding techniques to improve communication efficiency, reliability, and security in scenarios such as cache-aided networks and distributed computing and learning. Guided by information theory, we also develop information-theoretic frameworks and joint source-channel coding (JSCC) schemes for task-oriented communications. On the ML side, we aim to explain black-box ML models and to design trustworthy ML algorithms, with privacy and fairness guarantees, using information-theoretic tools.
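As a brief reminder, the two fundamental limits referred to above are classical Shannon-theoretic quantities, stated here for the discrete memoryless case purely for orientation:

```latex
% Minimum lossless compression rate (bits per symbol) of a
% discrete memoryless source X ~ p_X (Shannon's source coding theorem):
R^{*} = H(X) = -\sum_{x} p_X(x) \log_2 p_X(x)

% Maximum rate of reliable transmission over a discrete memoryless
% channel p(y|x) (Shannon's channel coding theorem):
C = \max_{p_X} I(X;Y)
```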
Network Information Theory
Today's wireless systems operate close to their limits and seem unable to satisfy future demands for reliable data rates. Existing theoretical studies suggest several remedies: introducing relays to assist the transmission; better exploiting network side-channels, such as feedback channels from the receivers to the transmitters or side-channels between transmitters or between receivers; and facilitating cooperation among the transmitters or the receivers of the network. Most current systems either ignore these possibilities entirely or exploit them only sub-optimally. The existing theoretical studies, however, are inconclusive: they either fail to determine the fundamental limit on reliable data rates, called capacity, that can be achieved with these side-channels, or they study only very specific networks. In this research field, we investigate general wireless networks that incorporate relays, feedback links, and cooperative communication. Our goal is to develop new coding strategies that improve over existing schemes for such networks and to establish the capacity of certain classes of networks.
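To make "fundamental limits" concrete, consider the simplest relay-aided network, the discrete memoryless relay channel with source input X, relay input X_r, relay observation Y_r, and destination output Y. Its capacity is still unknown in general; the classical cut-set bound of Cover and El Gamal (stated here only as a well-known benchmark, not a result of ours) caps it as:

```latex
% Cut-set upper bound on the relay-channel capacity C:
% the minimum of the broadcast cut (source to relay + destination)
% and the multiple-access cut (source + relay to destination).
C \le \max_{p(x,\,x_r)} \min\bigl\{\, I(X;\, Y, Y_r \mid X_r),\;\; I(X, X_r;\, Y) \,\bigr\}
```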
Communication for Distributed Caching and Computing
In the context of the Internet of Intelligence, the extensive computation and transmission that distributed network nodes perform to process vast amounts of data raise concerns about communication efficiency, reliability, and privacy. With the advancement of artificial intelligence and communication systems, emerging distributed computing frameworks, such as federated learning, edge learning, and decentralized learning, call for new designs that integrate storage, communication, and computing so that these resources interact efficiently. In addition, privacy and security concerns are receiving increasing attention and must be carefully addressed in the design of new systems. In large-scale distributed networks, coding techniques have the potential to alleviate communication bottlenecks, improve robustness to network size, and strengthen privacy protection for nodes, as the example below illustrates. Our goal is to leverage coding techniques to design efficient, reliable, and private systems.
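A well-known example of coding alleviating a communication bottleneck is the coded caching scheme of Maddah-Ali and Niesen: a server with N files serves K ≤ N users, each able to cache M files' worth of content. When t = KM/N is an integer, the worst-case delivery load is:

```latex
% Delivery rate of the Maddah-Ali--Niesen coded caching scheme.
% K(1 - M/N) is the load of uncoded delivery with local caching only;
% the factor 1/(1 + KM/N) is the additional "global caching gain"
% obtained by multicasting coded combinations to groups of users.
R(M) = K\left(1 - \frac{M}{N}\right) \cdot \frac{1}{1 + KM/N}
```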
Task-oriented Compression and Communication
The convergence of artificial intelligence, digital twins, and wireless networks is driving a paradigm shift from data-centric to intelligence-centric services in 6G. Supporting diverse intelligent applications with customized requirements calls for innovative information compression and transmission techniques that seamlessly integrate sensing, communication, and computation. Task-oriented communication emerges as a promising approach for 6G, leveraging task-specific information to optimize transmission strategies. Our goal is to design communication-efficient and trustworthy federated learning, edge inference, and semantic communication systems through advanced information extraction and network resource orchestration techniques.
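As a minimal sketch of the communication-efficient federated learning setting mentioned above, the code below runs plain federated averaging (FedAvg) on synthetic linear-regression data. The model, data, and hyperparameters are illustrative assumptions, not our proposed scheme; the per-round model exchange it makes explicit is exactly the cost that compression and resource-orchestration techniques target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic federated data: each client holds a local shard of
# linear-regression samples generated from a common ground truth.
num_clients, dim, n_local = 10, 5, 50
w_true = rng.normal(size=dim)
clients = []
for _ in range(num_clients):
    X = rng.normal(size=(n_local, dim))
    y = X @ w_true + 0.1 * rng.normal(size=n_local)
    clients.append((X, y))

def local_update(w, X, y, lr=0.05, epochs=5):
    """Run a few local gradient-descent epochs on the client's MSE loss."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# FedAvg: each round, every client trains locally and the server
# averages the returned models. Communication cost is one model
# upload/download per client per round.
w = np.zeros(dim)
for rnd in range(20):
    local_models = [local_update(w, X, y) for X, y in clients]
    w = np.mean(local_models, axis=0)

print("estimation error:", np.linalg.norm(w - w_true))
```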